    Performance Analysis of the ARIA Adaptive Media Processing Workflows using Colored Petri Nets

    Multimedia systems are among the most complex and interesting applications offered to users today. Their complexity derives mainly from the fact that they must process huge amounts of data while respecting real-time deadlines. For this reason, performance evaluation of the underlying workflow is a key issue in the design of a new multimedia system. In this paper we consider the ARchitecture for Interactive Arts (ARIA), an adaptive media processing workflow developed at Arizona State University, and outline a semi-automatic procedure to translate its specification into Colored Petri Nets. We then provide guidelines on how to compute the parameters for the performance models, and apply the proposed methodology to a realistic example of a face recognition application.
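
    The abstract does not reproduce the translation procedure or the resulting nets. Purely as an illustration of how a timed Petri net performance model can be exercised, the Python sketch below runs a simple token game for a hypothetical two-stage media pipeline; the place names and mean firing delays are assumptions, not values from the paper.

    import random

    # Hypothetical timed net for a two-stage media pipeline:
    # capture -> face detection -> delivery.  Delays are illustrative (ms).
    TRANSITIONS = [
        ("captured", "detected", 40.0),   # face detection stage
        ("detected", "delivered", 10.0),  # delivery stage
    ]

    def simulate(frames=1000, seed=0):
        """Fire the transitions for each frame token and accumulate the
        end-to-end latency, drawing exponential firing delays."""
        rng = random.Random(seed)
        places = {"captured": 0, "detected": 0, "delivered": 0}
        total = 0.0
        for _ in range(frames):
            places["captured"] += 1
            latency = 0.0
            for src, dst, mean_delay in TRANSITIONS:
                places[src] -= 1          # consume the token
                places[dst] += 1          # produce it in the next place
                latency += rng.expovariate(1.0 / mean_delay)
            total += latency
        return total / frames

    print(f"mean end-to-end latency: {simulate():.1f} ms")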

    Fragile Watermarking of 3D Models in Transformed Domain

    This paper presents an algorithm for the integrity protection of 3D models represented as a set of vertices and polygons. The proposed method performs a fragile watermarking of the vertex data, namely the 3D coordinates and the polygons, introducing only a very small error in the vertex coordinates. The watermark bit string is embedded into a secret vector space defined by the Karhunen–Loève transform derived from a key image. Experimental results show the good performance of the method and its security.
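
    The exact embedding rule is not given in the abstract. The sketch below, with assumed names and an assumed perturbation scheme, only illustrates the general idea: derive a Karhunen–Loève (PCA) basis from a key image, project the vertex coordinates into that secret space, and nudge selected coefficients by a very small amount.

    import numpy as np

    def klt_basis_from_key(key_image: np.ndarray) -> np.ndarray:
        """Karhunen-Loeve (PCA) basis derived from the rows of a key image."""
        rows = key_image - key_image.mean(axis=0)
        cov = rows.T @ rows
        _, vecs = np.linalg.eigh(cov)
        return vecs[:, ::-1]                 # columns = basis vectors, strongest first

    def embed(vertices: np.ndarray, bits, basis: np.ndarray, delta=1e-4):
        """Embed one bit per vertex by nudging the first KLT coefficient."""
        coeffs = vertices @ basis                     # project into the secret space
        for i, b in enumerate(bits):
            coeffs[i, 0] += delta if b else -delta    # tiny, sign-coded perturbation
        return coeffs @ basis.T                       # back to 3D coordinates

    # Hypothetical data: a random "key image" and a small vertex cloud.
    rng = np.random.default_rng(42)
    key_image = rng.random((64, 3))          # 3-column key so the basis is 3x3
    vertices = rng.random((100, 3))
    marked = embed(vertices, bits=[1, 0, 1, 1], basis=klt_basis_from_key(key_image))
    print("max coordinate error:", np.abs(marked - vertices).max())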

    Fluid Petri Nets for the Performance Evaluation of MapReduce Applications

    Big Data applications make it possible to analyze large amounts of data, not necessarily structured, but at the same time they present new challenges. For example, predicting the performance of frameworks such as Hadoop can be a costly task, hence the need for models that can be a valuable support for designers and developers. This paper contributes a novel modeling approach based on fluid Petri nets to predict the execution time of MapReduce jobs. The experiments we performed at CINECA, the Italian supercomputing center, show that the achieved accuracy is within 16% of the actual measurements on average.
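
    The fluid Petri net equations themselves are not in the abstract. As a much simpler stand-in, the sketch below shows the kind of fluid (deterministic-rate) reasoning such models rest on: treating map and reduce tasks as continuous quantities drained at a constant rate by the available slots. All job parameters are hypothetical.

    def fluid_job_time(n_map, n_reduce, map_slots, reduce_slots,
                       avg_map_s, avg_reduce_s):
        """Fluid approximation of a MapReduce job: tasks are a continuous
        quantity drained at a constant rate by the available slots,
        ignoring stragglers and scheduling overheads."""
        map_rate = map_slots / avg_map_s         # map tasks completed per second
        reduce_rate = reduce_slots / avg_reduce_s
        map_phase = n_map / map_rate
        reduce_phase = n_reduce / reduce_rate    # reduce starts after the map wave
        return map_phase + reduce_phase

    # Hypothetical job: 400 map and 40 reduce tasks on 20 map / 10 reduce slots.
    t = fluid_job_time(400, 40, map_slots=20, reduce_slots=10,
                       avg_map_s=30.0, avg_reduce_s=60.0)
    print(f"estimated completion time: {t:.0f} s")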

    The Novel hDHODH Inhibitor MEDS433 Prevents Influenza Virus Replication by Blocking Pyrimidine Biosynthesis

    The pharmacological management of influenza virus (IV) infections still poses a series of challenges due to the limited anti-IV drug arsenal. The development of new anti-influenza agents effective against antigenically different IVs is therefore an urgent priority. To meet this need, host-targeting antivirals (HTAs) can be evaluated as an alternative or complementary approach to current direct-acting agents (DAAs) for the therapy of IV infections. As a contribution to this antiviral strategy, in this study we characterized the anti-IV activity of MEDS433, a novel small-molecule inhibitor of human dihydroorotate dehydrogenase (hDHODH), a key cellular enzyme of the de novo pyrimidine biosynthesis pathway. MEDS433 exhibited potent antiviral activity against IAV and IBV replication, which was reversed by the addition of exogenous uridine and cytidine or of the hDHODH product orotate, indicating that MEDS433 specifically targets hDHODH activity in IV-infected cells. When MEDS433 was used in combination either with dipyridamole (DPY), an inhibitor of the pyrimidine salvage pathway, or with an anti-IV DAA such as N(4)-hydroxycytidine (NHC), synergistic anti-IV activities were observed. As a whole, these results indicate MEDS433 as a potential HTA candidate for developing novel anti-IV intervention approaches, either as a single agent or in combination regimens with DAAs.
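
    The abstract reports synergy between MEDS433 and DPY or NHC but does not state which reference model was used to quantify it. Purely as an illustration, the sketch below applies the Bliss independence model, one common choice, which compares the observed inhibition of a combination with the inhibition expected if the two agents acted independently.

    def bliss_expected(e_a: float, e_b: float) -> float:
        """Expected fractional inhibition if two drugs act independently."""
        return e_a + e_b - e_a * e_b

    def bliss_score(e_observed: float, e_a: float, e_b: float) -> float:
        """Positive score -> more inhibition than independence predicts (synergy)."""
        return e_observed - bliss_expected(e_a, e_b)

    # Hypothetical single-agent and combination inhibition values (fractions).
    print(bliss_score(e_observed=0.90, e_a=0.50, e_b=0.40))  # 0.90 - 0.70 = 0.20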

    Multi-class queuing networks models for energy optimization

    The increase of energy consumption and the related costs in large data centers has stimulated new research on techniques to optimize the power consumption of servers. In this paper we focus on systems that must process a peak workload consisting of different classes of applications. The objective is to implement a load control policy that allows efficient use of the power allocated to the resources. The proposed strategy controls the workload mix in order to achieve the maximum utilization of all the allocated resources. As a consequence, the provisioned power is fully utilized and the throughput maximized. Thus, the cost of executing a given workload is minimized, together with its energy consumption, since the required processing time is decreased.
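
    The control policy itself is not detailed in the abstract. The sketch below illustrates, for a hypothetical two-class, two-resource case, the underlying idea of choosing a workload mix that equalizes the aggregate service demand on the resources, so that no single resource becomes the bottleneck; the service demand values and the closed-form solution are assumptions for illustration only.

    def balancing_mix(demand_a, demand_b):
        """Fraction of class-A jobs in the mix that equalizes the aggregate
        service demand on two resources, so neither is the lone bottleneck.
        demand_* = (seconds/job on resource 1, seconds/job on resource 2)."""
        a1, a2 = demand_a
        b1, b2 = demand_b
        denom = a1 - a2 - b1 + b2
        if denom == 0:                        # identical profiles: any mix works
            return 0.5
        return min(max((b2 - b1) / denom, 0.0), 1.0)

    # Hypothetical demands: class A is CPU-bound, class B is disk-bound.
    beta = balancing_mix(demand_a=(0.8, 0.2), demand_b=(0.3, 0.7))
    d_cpu = beta * 0.8 + (1 - beta) * 0.3
    d_disk = beta * 0.2 + (1 - beta) * 0.7
    print(f"class-A share: {beta:.2f}  demand/job: cpu={d_cpu:.2f}s disk={d_disk:.2f}s")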